
GCP Billing Export Integration with nOps

FAQs

1. What if I don’t see my billing data in BigQuery?

  • Ensure that Billing Export is enabled for the correct project.
  • Check if the dataset and table names match what was configured in nOps.
  • Wait up to 48 hours for the first full dataset to appear.
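
If the tables exist but still look empty, a quick query can confirm whether any rows have landed yet. The sketch below is illustrative only: it assumes the google-cloud-bigquery client library, and the project, dataset, and table names are placeholders you would replace with your own.

  from google.cloud import bigquery

  # Placeholder: the project that hosts your billing export dataset.
  client = bigquery.Client(project="your-billing-project")

  # Placeholder table name: use the Detailed Usage Cost table from your dataset.
  query = """
      SELECT MAX(export_time) AS latest_export, COUNT(*) AS row_count
      FROM `your-billing-project.gcp_billing_exports.gcp_billing_export_resource_v1_XXXXXX`
  """
  row = list(client.query(query).result())[0]
  print(f"Rows: {row.row_count}, latest export: {row.latest_export}")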

2. Can I use customer-managed encryption keys?

No, Google Cloud only supports Google-managed encryption for billing data exports.

3. How often is billing data updated on nOps?

Billing data is refreshed daily, but there might be a 48-hour delay for full updates.

4. How long does it take for the Pricing, Detailed Resource, and CUD tables to be available?

The tables under the Detailed Usage Cost, Pricing, and Committed Use Discounts exports may take up to 48 hours to be created.

5. Do all exports need to be in the same location?

Yes. The Detailed Usage Cost, Pricing, and Committed Use Discounts exports must be configured to use the same location (US or EU) and preferably the same project. Using different locations will cause integration issues with nOps.

6. Why do I need to grant permissions at three different levels (Organization, Billing Account, Project)?

Each level provides different types of access:

  • Organization-level roles allow nOps to discover and analyze resources across all projects in your organization (e.g., Compute instances, GKE clusters, recommendations).
  • The Billing Account-level role is required to access billing metadata and currency information; it cannot be granted at the project level.
  • Project-level roles provide access to the BigQuery dataset containing your cost export data.

Granting roles only at one level will result in incomplete data or integration failures.

Setting Up the Integration

To integrate GCP Billing Exports with nOps, follow these steps:


1. Configure GCP Billing Exports

Before connecting to nOps, you must enable billing exports in GCP.

Step 1: Select or Create a GCP Project

A Google Cloud project is required to store billing data. If managing multiple billing accounts, configure the export for each account separately.

note

For help setting up the Cloud Billing account, refer to this documentation.


Step 2: Enable Cloud Billing Exports to BigQuery

You must enable three specific exports: Detailed Usage Cost, Pricing, and Committed Use Discounts. You can create the BigQuery dataset directly within the first export configuration.

  1. Go to the Billing section in the GCP Console and select Billing Export from the left-hand menu.
  2. Configure Detailed Usage Cost Export:
    • Click Edit Settings under Detailed Usage Cost.
    • Select your Project.
    • Create a New Dataset (e.g., gcp_billing_exports) or select an existing one.
    • Choose a location (US or EU) and click Save.
  3. Configure Pricing Export:
    • Click Edit Settings under Pricing.
    • Select the same Project and Dataset as above.
    • Click Save.
  4. Configure Committed Use Discounts Export:
    • Click Edit Settings under Committed Use Discounts.
    • Select the same Project.
    • Enter a dataset name (e.g., cudsExportDataset) and select the same location (Multi-region US or EU).
    • Click Save.

GIF showing configuration of all three exports

Important Configuration Details
  • CUD Export Status: After creating the Committed Use Discounts export, it may initially appear as Disabled. Simply refresh the page, and it should show as Active.
  • Location Consistency: All three exports must use the same location (US or EU) and preferably the same project. Using different locations will cause integration issues.
  • Table Creation: Each export creates a separate BigQuery table with its own unique Table ID. These tables may take up to 48 hours to be created.

2. Connect GCP Billing Exports to nOps

Once billing exports are configured in GCP, connect them to nOps.

  1. Log in to nOps and go to Organization Settings.
  2. Navigate to Inform Integrations.
  3. Select GCP Billing Export from the available integrations.
  4. Enter the required information:
    • GCP Billing Account ID
    • BigQuery Table ID for Detailed Usage Cost
    • BigQuery Table ID for Pricing Export
    • BigQuery Table ID for Committed Use Discounts (if configured)
  5. Click Create Integration.

Finding Your IDs

Billing Account ID: Found on the Billing page (Format: 0115B9-C18400-A979DC)

BigQuery Table IDs: The easiest way to find the correct Table IDs is via the Billing Export page:

  1. Go to Billing > Billing Export.
  2. Click the Dataset name link for each export (Detailed Usage Cost, Pricing, CUDs).
  3. This opens BigQuery with the correct dataset selected.
  4. Click on the Table inside the dataset (e.g., gcp_billing_export_resource_v1_...).
  5. In the Details tab, copy the Table ID field.

Note: Each export has its own distinct Table ID, even when the exports share the same dataset.
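
If you prefer to pull these values programmatically (which also confirms the export tables have been created), a short sketch using the google-cloud-bigquery client library looks like the following. The project and dataset names are placeholders; replace them with the ones you configured.

  from google.cloud import bigquery

  # Placeholder: the project and dataset you selected when configuring the exports.
  client = bigquery.Client(project="your-billing-project")

  for table in client.list_tables("your-billing-project.gcp_billing_exports"):
      # Fully qualified ID in the form shown in the BigQuery Details tab.
      print(f"{table.project}.{table.dataset_id}.{table.table_id}")

If the Committed Use Discounts export uses a separate dataset (e.g., cudsExportDataset), run the same loop against that dataset as well.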

GIF showing linking of GCP billing data to nOps

After completing this step, nOps will generate a service account email. Copy this email address — you will need it in the next section to grant permissions.

nOps Service Account Email


3. Grant Service Account Permissions in GCP

Once the nOps service account email is generated, grant it the required permissions. Permissions must be granted at three different levels: Organization, Billing Account, and Project.

Important

The nOps service account requires permissions at multiple levels. Granting roles only at the project level will not provide sufficient access for full cost visibility and recommendations.


A. Organization-Level Roles

These roles provide visibility across your entire GCP organization for asset discovery, recommendations, and resource enumeration.

Step 1: Navigate to Organization IAM

  1. Go to IAM & Admin → IAM in the Google Cloud Console.
  2. At the top of the page, use the project/organization selector to switch to your Organization (not a specific project).

Step 2: Grant Access to nOps Service Account

  1. Click + Grant Access.
  2. In the New principals field, enter the nOps service account email (obtained from the integration setup).
  3. Add the following roles:
    • Cloud Asset Viewer (roles/cloudasset.viewer) – To enumerate assets across services for correlation.
    • Browser (roles/browser) – To enumerate projects and folders.
    • Recommender Viewer (roles/recommender.viewer) – To read cost recommendations (e.g., rightsizing, idle resources).
    • Logs Viewer (roles/logging.viewer) – To read logs for resource analysis.
    • Compute Viewer (roles/compute.viewer) – To read Compute Engine data (CUDs, instances, regions).
    • Kubernetes Engine Viewer (roles/container.viewer) – To read GKE clusters, node pools, and Kubernetes resources.
    • Cloud SQL Viewer (roles/cloudsql.viewer) – To read Cloud SQL instances and configurations.
    • Cloud Run Viewer (roles/run.viewer) – To read Cloud Run services and configurations.
  4. Click Save.

Granting Organization IAM Roles
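
To double-check that every role landed at the organization level (rather than on a single project), you can read the organization's IAM policy and look for the nOps principal. This is a minimal sketch assuming the google-cloud-resource-manager client library; the organization ID and service account email are placeholders.

  from google.cloud import resourcemanager_v3

  ORG_ID = "123456789012"  # placeholder organization ID
  NOPS_SA = "nops-integration@example.iam.gserviceaccount.com"  # placeholder; use the email generated by nOps

  client = resourcemanager_v3.OrganizationsClient()
  policy = client.get_iam_policy(resource=f"organizations/{ORG_ID}")

  # Collect every role bound to the nOps service account at the organization level.
  granted = [b.role for b in policy.bindings if f"serviceAccount:{NOPS_SA}" in b.members]
  print("Organization-level roles:", granted)

The same pattern works with resourcemanager_v3.ProjectsClient if you later want to verify the project-level roles from section C.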


B. Billing Account-Level Role

This role is required for currency and billing metadata validation. It must be granted directly on the Billing Account.

note

Permissions granted in the standard Project IAM page do not propagate to the Billing Account. You must grant this role via the Billing console.

Step 1: Navigate to Billing Account Management

  1. Go to Billing in the Google Cloud Console.

  2. From the billing accounts list, click on the billing account name that you configured the exports for.

  3. Click Account Management in the left-hand menu.

Step 2: Open the Info Panel

  1. On the Account Management page, click Show info panel in the top-right corner (if the panel is not already visible).
  2. This will open a side panel titled with your billing account name.

Step 3: Grant Billing Account Viewer Role

  1. In the info panel on the right, click + Add Principal.
  2. In the New principals field, enter the nOps service account email.
  3. Select the role: Billing Account Viewer (roles/billing.viewer).
  4. Click Save.

Granting Billing Account IAM
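
Because this grant lives on the Billing Account rather than in project IAM, it is worth verifying separately. A hedged sketch using the google-cloud-billing client library (the billing account ID and service account email are placeholders):

  from google.cloud import billing_v1

  BILLING_ACCOUNT_ID = "0115B9-C18400-A979DC"  # placeholder; use your own billing account ID
  NOPS_SA = "nops-integration@example.iam.gserviceaccount.com"  # placeholder

  client = billing_v1.CloudBillingClient()
  policy = client.get_iam_policy(resource=f"billingAccounts/{BILLING_ACCOUNT_ID}")

  # Check that the Billing Account Viewer role is bound to the nOps service account.
  has_viewer = any(
      b.role == "roles/billing.viewer" and f"serviceAccount:{NOPS_SA}" in b.members
      for b in policy.bindings
  )
  print("Billing Account Viewer granted:", has_viewer)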


C. Dataset-Level Roles (Cost Export Datasets)

These roles are required for reading BigQuery billing data and must be granted on each of the three datasets that contain your cost export data:

  • Detailed Usage Cost dataset
  • Pricing dataset
  • Committed Use Discounts dataset

You will repeat the following steps for each dataset.

Step 1: Navigate to Billing Export

  1. Go to Billing in the Google Cloud Console.
  2. From the billing accounts list, click on the billing account name that you configured the exports for.
  3. Select Billing Export from the left-hand menu.
  4. On the Billing Export page, you'll see the Dataset name links for your three configured exports:
    • Detailed Usage Cost export dataset
    • Pricing export dataset
    • Committed Use Discounts export dataset

Step 2: Open Dataset and Manage Permissions

  1. Click on the Dataset name link for the first export (e.g., Detailed Usage Cost). This will open BigQuery with that dataset selected.
  2. In BigQuery, with the dataset selected, click the Share button at the top.
  3. Select Manage permissions.

Step 3: Grant BigQuery Data Viewer Role

  1. In the permissions panel, click Add Principal.
  2. In the New principals field, enter the nOps service account email.
  3. In the Select a role dropdown, choose BigQuery Data Viewer (roles/bigquery.dataViewer).
  4. Click Save.

Step 4: Repeat for Remaining Datasets

Go back to the Billing Export page and repeat Steps 2-3 for the other two datasets:

  • Pricing export dataset
  • Committed Use Discounts export dataset

note

If you configured all your exports to use the same dataset, you only need to grant the permissions once on that shared dataset.

Dataset Roles
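
If you would rather script the dataset grants (for example, when the exports live in more than one dataset), the access list can also be updated with the google-cloud-bigquery client library. The sketch below uses placeholder names; at the dataset level, the READER access entry corresponds to BigQuery Data Viewer.

  from google.cloud import bigquery

  NOPS_SA = "nops-integration@example.iam.gserviceaccount.com"  # placeholder
  DATASETS = [
      "your-billing-project.gcp_billing_exports",  # Detailed Usage Cost / Pricing (placeholder)
      "your-billing-project.cudsExportDataset",    # Committed Use Discounts (placeholder)
  ]

  client = bigquery.Client(project="your-billing-project")

  for dataset_id in DATASETS:
      dataset = client.get_dataset(dataset_id)
      entries = list(dataset.access_entries)
      # READER at the dataset level is equivalent to BigQuery Data Viewer.
      entries.append(bigquery.AccessEntry(role="READER", entity_type="userByEmail", entity_id=NOPS_SA))
      dataset.access_entries = entries
      client.update_dataset(dataset, ["access_entries"])
      print(f"Granted read access on {dataset_id}")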

Step 5: Grant Project-Level BigQuery Resource Viewer Role

The BigQuery Resource Viewer role must be granted at the project level to read reservation and slot information.

  1. Go to IAM & Admin → IAM in the Google Cloud Console.
  2. Select the project that contains your BigQuery billing export dataset.
  3. Click + Grant Access.
  4. In the New principals field, enter the nOps service account email.
  5. Select the role: BigQuery Resource Viewer (roles/bigquery.resourceViewer).
  6. Click Save.

BigQuery Resource Viewer


4. Enable Required APIs

The following APIs must be enabled in ALL projects linked to your billing account, not just the project hosting the cost export. This ensures nOps can collect comprehensive cost and usage data across your entire environment.

Why do we need these APIs? Read our API Requirements & Justification guide to understand the purpose of each API and its cost implications.

You can choose between an automated Terraform approach (Option A) or manual configuration (Option B).

Option A: Terraform Module (Recommended)

For organizations with many projects, we strongly recommend using the nOps GCP API Enablement Terraform Module. This module programmatically enables all required APIs (Cloud Asset, Billing, Recommender, GKE, BigQuery Reservation) across your entire organization, ensuring consistency and saving significant time compared to manual enablement.

See the Terraform Setup Guide


Option B: Manual Steps

If you prefer to enable APIs manually, follow these steps for every project in your organization.

Step 1: Open APIs & Services

  1. Go to APIs & Services → Library in the Google Cloud Console.
  2. Select a project from the top dropdown menu.

Step 2: Enable Required APIs

Search for and enable each of the following APIs:

  1. Recommender API – For cost optimization recommendations.
  2. Cloud Asset API – For asset inventory and discovery.
  3. Kubernetes Engine API – For GKE cluster data.
  4. BigQuery Reservation API – For BigQuery capacity commitments.
  5. Cloud Billing API – For billing account access.

For each API:

  1. Search for the API name in the search bar.
  2. Click on the API in the results.
  3. Click Enable.
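
If you manage more than a handful of projects but do not want to adopt the Terraform module, the same APIs can be batch-enabled with a short script. This sketch assumes the google-api-python-client library and the Service Usage API; the project IDs are placeholders, and the caller needs permission to enable services in each project.

  from googleapiclient import discovery

  REQUIRED_APIS = [
      "recommender.googleapis.com",
      "cloudasset.googleapis.com",
      "container.googleapis.com",
      "bigqueryreservation.googleapis.com",
      "cloudbilling.googleapis.com",
  ]

  # Placeholder list: every project linked to your billing account.
  PROJECTS = ["project-one", "project-two"]

  serviceusage = discovery.build("serviceusage", "v1")

  for project in PROJECTS:
      serviceusage.services().batchEnable(
          parent=f"projects/{project}",
          body={"serviceIds": REQUIRED_APIS},
      ).execute()  # returns a long-running operation
      print(f"Requested API enablement for {project}")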

Why Use nOps for GCP Cost Management?

  • Unified Cost Visibility – View all cloud spend, including AWS, Azure, and GCP, in a single platform.
  • Automated Cost Analysis – Identify inefficiencies and optimize resource allocation.
  • Custom Reporting – Create tailored reports for detailed GCP spend analysis.

By integrating GCP Billing Exports with Inform, you gain deeper visibility into cloud costs, empowering smarter budget decisions.